# Korean Pre-training

## KoBERT-LM
License: Apache-2.0 · Author: monologg
KoBERT-LM is a pre-trained language model optimized for Korean, based on the BERT architecture and further pre-trained specifically for Korean text.
Tags: Large Language Model, Korean
## RoBERTa Ko Small
License: Apache-2.0 · Author: lassl
A compact Korean RoBERTa model trained under the LASSL framework, suitable for various Korean natural language processing tasks.
Tags: Large Language Model, Transformers, Korean
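Both checkpoints follow the usual Hugging Face transformers interface for masked language models, so they can typically be loaded directly from the Hub. Below is a minimal sketch, assuming the repository id `lassl/roberta-ko-small` matches the author name listed above (this page does not give the exact id, and KoBERT-LM may additionally require its own custom tokenizer):

```python
# Minimal sketch: load a Korean masked-LM checkpoint and run one forward pass.
# The hub id is an assumption inferred from the author name on this page.
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "lassl/roberta-ko-small"  # assumed hub id for RoBERTa Ko Small
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode a Korean sentence and inspect the masked-LM logits.
text = "한국어 자연어 처리는 재미있다."  # "Korean NLP is fun."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```

For a quick sanity check, the same id can also be passed to transformers' `fill-mask` pipeline to predict masked tokens in Korean text.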